Mutual information, metric entropy and cumulative relative entropy risk

Authors

Abstract


Similar articles

Information Theory 4.1 Entropy and Mutual Information

Neural encoding and decoding focus on the question: "What does the response of a neuron tell us about a stimulus?" In this chapter we consider a related but different question: "How much does the neural response tell us about a stimulus?" The techniques of information theory allow us to answer this question in a quantitative manner. Furthermore, we can use them to ask what forms of neural r...


Mutual information challenges entropy bounds

We consider some formulations of the entropy bounds at the semiclassical level. The entropy S(V) localized in a region V is divergent in quantum field theory (QFT). Instead, we focus on the mutual information I(V,W) = S(V) + S(W) − S(V ∪ W) between two non-intersecting sets V and W. This is a low-energy quantity, independent of the regularization scheme. In addition, the mut...
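The identity above has a familiar finite, discrete analogue: for random variables X and Y, I(X;Y) = H(X) + H(Y) − H(X,Y). A minimal sketch, computing this directly from a joint probability table (the `mutual_information` helper is illustrative, not from the paper):

```python
import math
from collections import defaultdict

def entropy(p):
    """Shannon entropy (in nats) of a probability mapping."""
    return -sum(q * math.log(q) for q in p.values() if q > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a dict {(x, y): probability}."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), q in joint.items():
        px[x] += q
        py[y] += q
    return entropy(px) + entropy(py) - entropy(joint)

# Independent fair bits: the joint factorizes, so I(X;Y) = 0.
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

# Perfectly correlated fair bits: I(X;Y) = H(X) = ln 2 nats (one bit).
corr = {(0, 0): 0.5, (1, 1): 0.5}
```

Unlike the individual entropies in the QFT setting, each term here is finite, which is why the mutual-information combination is the regularization-independent quantity.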


Mutual information is copula entropy

In information theory, mutual information (MI) is usually treated as a concept distinct from entropy.[1] In this paper, we prove using copulas [2] that they are essentially the same: mutual information is also a kind of entropy, called copula entropy. Based on this result, we propose a simple method for estimating mutual information. Copula theory concerns dependence and the measurement of association.[2] Skla...
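The claimed identity is that MI equals the negative of the copula entropy. A small sanity check, assuming the standard closed forms for the bivariate Gaussian case, where I(X;Y) = −½ ln(1 − ρ²) and the Gaussian copula has differential entropy ½ ln(1 − ρ²):

```python
import math

def gaussian_mi(rho):
    """Mutual information of a bivariate Gaussian with correlation rho (nats)."""
    return -0.5 * math.log(1.0 - rho * rho)

def gaussian_copula_entropy(rho):
    """Differential entropy of the Gaussian copula with correlation rho (nats)."""
    return 0.5 * math.log(1.0 - rho * rho)

# The two quantities are exact negatives of each other for any |rho| < 1.
rho = 0.6
mi = gaussian_mi(rho)
hc = gaussian_copula_entropy(rho)
```

The estimation method suggested by this identity is rank-based: transform each margin to its empirical CDF and estimate the entropy of the resulting (copula) sample, which sidesteps estimating the marginal densities.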


Information Distances versus Entropy Metric

Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years. These information distance measures differ from entropy metrics: the former are based on Kolmogorov complexity and the latter on Shannon entropy. However, for any computable probability distributions, up to a constant, the expected val...


Some Results on Weighted Cumulative Entropy

Building on the studies of Rao et al. (2004) and Di Crescenzo and Longobardi (2009), Misagh et al. (2011) proposed a weighted information measure based on the cumulative entropy, called Weighted Cumulative Entropy (WCE). The above-mentioned model is a shift-dependent uncertainty measure. In this paper, we examine some of the properties of WCE and obtain some bounds for it. In order to ...
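The abstract names the cumulative entropy (CE) and its weighted variant (WCE) without stating their definitions. A numerical sketch, assuming the forms CE(X) = −∫ F(x) ln F(x) dx (Di Crescenzo and Longobardi) and WCE(X) = −∫ x F(x) ln F(x) dx (Misagh et al.), checked against the closed forms 1/4 and 1/9 for a Uniform(0, 1) variable; the `integrate` helper is illustrative, not from the paper:

```python
import math

def integrate(f, a, b, n=50_000):
    """Composite midpoint rule; avoids evaluating f at the endpoints."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

F = lambda x: x  # CDF of Uniform(0, 1)

# Cumulative entropy: CE = -integral of F(x) ln F(x) over the support.
ce = integrate(lambda x: -F(x) * math.log(F(x)), 0.0, 1.0)

# Weighted cumulative entropy: the weight x makes the measure shift-dependent.
wce = integrate(lambda x: -x * F(x) * math.log(F(x)), 0.0, 1.0)
```

The weight x is what makes WCE shift-dependent: translating X changes WCE but leaves CE unchanged, which is the distinction the abstract highlights.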


Journal

Journal title: The Annals of Statistics

Year: 1997

ISSN: 0090-5364

DOI: 10.1214/aos/1030741081